On Fast Deep Nets for AGI Vision

Authors

  • Jürgen Schmidhuber
  • Dan C. Ciresan
  • Ueli Meier
  • Jonathan Masci
  • Alex Graves
Abstract

Artificial General Intelligence will not be general without computer vision. Biologically inspired adaptive vision models have started to outperform traditional pre-programmed methods: our fast deep / recurrent neural networks recently collected a string of 1st ranks in many important visual pattern recognition benchmarks: IJCNN traffic sign competition, NORB, CIFAR10, MNIST, three ICDAR handwriting competitions. We greatly profit from recent advances in computing hardware, complementing recent progress in the AGI theory of mathematically optimal universal problem solvers.

Keywords: AGI, Fast Deep Neural Nets, Computer Vision, Hardware Advances vs Theoretical Progress.


Similar resources

A Taxonomy of Deep Convolutional Neural Nets for Computer Vision

Traditional architectures for solving computer vision problems and the degree of success they enjoyed have been heavily reliant on hand-crafted features. However, of late, deep learning techniques have offered a compelling alternative – that of automatically learning problem-specific features. With this new paradigm, every problem in computer vision is now being re-examined from a deep learning...


Fast algorithms for learning deep neural networks

With the increase in computation power and data availability in recent times, machine learning and statistics have seen enormous development and widespread application in areas such as computer vision, computational biology and others. A focus of current research is deep neural nets: nested functions consisting of a hierarchy of layers of thousands of weights and nonlinear, hidden units. Th...
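The "nested functions" view of a deep net described above can be illustrated with a minimal forward pass: each layer applies a weight matrix followed by a nonlinearity to the previous layer's output. This is a generic sketch for illustration, not the algorithm of the paper above; the layer sizes and the tanh nonlinearity are arbitrary choices.

```python
import numpy as np

def forward(x, weights):
    """Deep net as a nested function: h <- tanh(W @ h), applied per layer."""
    h = x
    for W in weights:
        h = np.tanh(W @ h)
    return h

# Two hypothetical layers: 4 -> 8 -> 3 units.
rng = np.random.default_rng(1)
layers = [rng.standard_normal((8, 4)), rng.standard_normal((3, 8))]
out = forward(np.ones(4), layers)
# out is a length-3 vector with entries in (-1, 1) due to tanh
```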


Generalization and Expressivity for Deep Nets

Along with the rapid development of deep learning in practice, theoretical explanations for its success have become urgent. Generalization and expressivity are two widely used measurements to quantify the theoretical behavior of deep learning. Expressivity focuses on finding functions that are expressible by deep nets but cannot be approximated by shallow nets with a similar number of neurons. It usually ...


Imprecise Probability as a Linking Mechanism between Deep Learning, Symbolic Cognition and Local Feature Detection in Vision Processing

A novel approach to computer vision is outlined, involving the use of imprecise probabilities to connect a deep learning based hierarchical vision system with both local feature detection based preprocessing and symbolic cognition based guidance. The core notion is to cause the deep learning vision system to utilize imprecise rather than single-point probabilities, and use local feature detecti...


Fast Learning with Noise in Deep Neural Nets

Dropout has been proposed as an effective and simple trick [1] to combat overfitting in deep neural nets. The idea is to randomly mask out input and internal units during training. Despite its usefulness, there has been very little and scattered understanding of injecting noise into the internal units of deep learning architectures. In this paper, we study the effect of dropout on both input and hidden l...
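The masking idea described above can be sketched in a few lines: during training, each unit is zeroed with probability p, and (in the common "inverted dropout" variant) the survivors are rescaled so the expected activation is unchanged. This is a generic illustration of the technique, not the specific method studied in the paper above.

```python
import numpy as np

def dropout(x, p_drop, rng, train=True):
    """Inverted dropout: zero each unit with probability p_drop during
    training, and scale survivors by 1/(1 - p_drop) so the expected
    value matches the unmasked activation; identity at test time."""
    if not train or p_drop == 0.0:
        return x
    mask = rng.random(x.shape) >= p_drop   # True = unit survives
    return x * mask / (1.0 - p_drop)

rng = np.random.default_rng(0)
x = np.ones(10)
y = dropout(x, 0.5, rng)
# each entry of y is either 0.0 (dropped) or 2.0 (survived, rescaled)
```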



Publication date: 2011